Tensor completion in hierarchical tensor representations
Authors
Holger Rauhut, RWTH Aachen University, Lehrstuhl C für Mathematik (Analysis), Templergraben 55, 52062 Aachen, Germany
Reinhold Schneider, Technische Universität Berlin, Straße des 17. Juni 136, 10623 Berlin, e-mail: schneidr@math.tu-berlin.de
Željka Stojanac, RWTH Aachen University, Lehrstuhl C für Mathematik (Analysis), Templergraben 55, 52062 Aachen, Germany
Abstract
Compressed sensing extends from the recovery of sparse vectors from undersampled measurements via efficient algorithms to the recovery of matrices of low rank from incomplete information. Here we consider a further extension to the reconstruction of tensors of low multi-linear rank in recently introduced hierarchical tensor formats from a small number of measurements. Hierarchical tensors are a flexible generalization of the well-known Tucker representation, with the advantage that the number of degrees of freedom of a low-rank tensor does not scale exponentially with the order of the tensor. While the corresponding tensor decompositions can be computed efficiently via successive applications of (matrix) singular value decompositions, some important properties of the singular value decomposition do not extend from the matrix to the tensor case. This results in major computational and theoretical difficulties in designing and analyzing algorithms for low-rank tensor recovery. For instance, a canonical analogue of the tensor nuclear norm is NP-hard to compute in general, in stark contrast to the matrix case. In this book chapter we consider versions of iterative hard thresholding schemes adapted to hierarchical tensor formats. One variant builds on methods from Riemannian optimization and uses a retraction mapping from the tangent space of the manifold of low-rank tensors back to this manifold. We provide initial partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map. Moreover, we provide an estimate of the number of measurements that ensures the TRIP at a given tensor rank with high probability for Gaussian measurement maps.
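As a concrete illustration of the scheme described in the abstract, the following is a minimal Python sketch of a generic tensor iterative hard thresholding iteration, $X_{j+1} = H_r\big(X_j + \mu\,\mathcal{A}^*(y - \mathcal{A}(X_j))\big)$. It is not the chapter's implementation: the projection $H_r$ onto tensors of low hierarchical rank is replaced here by a quasi-optimal truncated higher-order SVD in the Tucker format, and the measurement map, step size, tensor size, and ranks in the toy example are assumptions chosen only for demonstration. Recall that the TRIP of rank $r$ with constant $\delta_r$ requires $(1-\delta_r)\|X\|_F^2 \le \|\mathcal{A}(X)\|_2^2 \le (1+\delta_r)\|X\|_F^2$ for all tensors $X$ of rank at most $r$; when it holds, an iteration of this type can be expected to recover low-rank tensors from few Gaussian measurements.

```python
# Hedged sketch (not the authors' implementation): generic tensor iterative
# hard thresholding with a truncated higher-order SVD standing in for the
# hierarchical-SVD-based rank truncation used in the chapter.
import numpy as np


def mode_product(T, M, mode):
    """Multiply tensor T along the given mode by the matrix M."""
    T = np.moveaxis(T, mode, 0)
    T = np.tensordot(M, T, axes=1)
    return np.moveaxis(T, 0, mode)


def hosvd_truncate(X, ranks):
    """Quasi-optimal projection of X onto multilinear rank <= ranks via a
    truncated higher-order SVD (the exact best approximation is NP-hard)."""
    Y = X
    for mode, r in enumerate(ranks):
        unfolding = np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1))
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        U = U[:, :r]
        Y = mode_product(Y, U @ U.T, mode)  # project this mode onto span(U)
    return Y


def tensor_iht(A, A_adj, y, shape, ranks, mu=1.0, n_iter=300):
    """Tensor IHT: gradient step on ||A(X) - y||^2 followed by rank truncation."""
    X = np.zeros(shape)
    for _ in range(n_iter):
        X = hosvd_truncate(X + mu * A_adj(y - A(X)), ranks)
    return X


# Toy usage (assumed setup): Gaussian measurement map on an 8x8x8 tensor of
# multilinear rank (2, 2, 2); m is chosen well above the number of degrees
# of freedom so that the TRIP plausibly holds for this rank.
rng = np.random.default_rng(0)
shape, ranks, m = (8, 8, 8), (2, 2, 2), 300
Phi = rng.standard_normal((m, int(np.prod(shape)))) / np.sqrt(m)
A = lambda X: Phi @ X.ravel()
A_adj = lambda z: (Phi.T @ z).reshape(shape)
X_true = hosvd_truncate(rng.standard_normal(shape), ranks)
X_hat = tensor_iht(A, A_adj, A(X_true), shape, ranks)
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```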
Similar resources
Irreducibility of the tensor product of Albeverio's representations of the Braid groups $B_3$ and $B_4$
We consider Albeverio's linear representations of the braid groups $B_3$ and $B_4$. We specialize the indeterminates used in defining these representations to nonzero complex numbers. We then consider the tensor products of the representations of $B_3$ and the tensor products of those of $B_4$. We then determine necessary and sufficient conditions that guarantee the irreducibility of th...
Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion
In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One of the difficulties with the parameters of the network is that a representation of its neuron mathematical model is not possible. For this reason, a new representation of this network is suggested that solves this difficulty. In the representation, th...
On tensor product $L$-functions and Langlands functoriality
In the spirit of the Langlands proposal on Beyond Endoscopy, we discuss the explicit relation between Langlands functorial transfers and automorphic $L$-functions. It is well known that the poles of the $L$-functions have a deep impact on Langlands functoriality. Our discussion also includes the meaning of the central value of the tensor product $L$-functions in terms of the Langl...
DERIVATIONS OF TENSOR PRODUCT OF SIMPLE C*-ALGEBRAS
In this paper we study the properties of derivations of $A \otimes B$, where $A$ and $B$ are simple separable C*-algebras and $A \otimes B$ is the C*-completion of the algebraic tensor product of $A$ and $B$ with respect to a C*-norm on $A \otimes B$, and we characterize the derivations of $A \otimes B$ in terms of the derivations of $A$ and $B$.
Hierarchical Tucker Tensor Optimization - Applications to Tensor Completion
In this work, we develop an optimization framework for problems whose solutions are well-approximated by Hierarchical Tucker (HT) tensors, an efficient structured tensor format based on recursive subspace factorizations. Using the differential geometric tools presented here, we construct standard optimization algorithms such as Steepest Descent and Conjugate Gradient for interpolating ...
Journal: CoRR
Volume: abs/1404.3905
Publication year: 2014